Normalized information-based divergences

Authors

  • Jean-François Coeurjolly
  • Rémy Drouilhet
  • Jean-François Robineau
Abstract

This paper is devoted to the mathematical study of divergences based on mutual information that are well suited to categorical random vectors. These divergences generalize the "entropy distance" and the "information distance". Their main characteristic is that they combine a complexity term with the mutual information. We then introduce the notion of a (normalized) information-based divergence, propose several examples, and discuss their mathematical properties.
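The entropy distance and information distance referred to in the abstract have standard empirical forms: the entropy distance is d(X,Y) = H(X,Y) - I(X;Y) = H(X|Y) + H(Y|X), and the normalized information distance (Rajski's distance) is D(X,Y) = 1 - I(X;Y)/H(X,Y). A minimal sketch for categorical samples (function names are my own, not from the paper) might look like:

```python
import math
from collections import Counter

def entropies(pairs):
    """Return (H(X), H(Y), H(X,Y)) in bits from a list of (x, y) samples."""
    n = len(pairs)
    def H(counts):
        return -sum(c / n * math.log2(c / n) for c in counts.values())
    hx = H(Counter(x for x, _ in pairs))
    hy = H(Counter(y for _, y in pairs))
    hxy = H(Counter(pairs))
    return hx, hy, hxy

def entropy_distance(pairs):
    """d(X, Y) = H(X,Y) - I(X;Y), i.e. H(X|Y) + H(Y|X)."""
    hx, hy, hxy = entropies(pairs)
    i = hx + hy - hxy               # mutual information I(X;Y)
    return hxy - i

def normalized_information_distance(pairs):
    """D(X, Y) = 1 - I(X;Y) / H(X,Y), a value in [0, 1]."""
    hx, hy, hxy = entropies(pairs)
    if hxy == 0:                    # both variables are constant
        return 0.0
    i = hx + hy - hxy
    return 1 - i / hxy
```

Identical variables give distance 0, while independent ones give the maximal normalized distance 1, which is the normalization property the paper studies.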

Related articles

Robust Covariance Estimators Based on Information Divergences and Riemannian Manifold

This paper proposes a class of covariance estimators based on information divergences in heterogeneous environments. In particular, the problem of covariance estimation is reformulated on the Riemannian manifold of Hermitian positive-definite (HPD) matrices. The means associated with information divergences are derived and used as the estimators. Without resorting to the complete knowledge of t...


Wyner's Common Information under Rényi Divergence Measures

We study a generalized version of Wyner’s common information problem (also coined the distributed sources simulation problem). The original common information problem consists in understanding the minimum rate of the common input to independent processors to generate an approximation of a joint distribution when the distance measure used to quantify the discrepancy between the synthesized and t...


On Hölder Projective Divergences

We describe a framework to build distances by measuring the tightness of inequalities, and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities, and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy-Schwarz divergence. We repo...


Divergence-based classification in learning vector quantization

We discuss the use of divergences in dissimilarity based classification. Divergences can be employed whenever vectorial data consists of non-negative, potentially normalized features. This is, for instance, the case in spectral data or histograms. In particular, we introduce and study Divergence Based Learning Vector Quantization (DLVQ). We derive cost function based DLVQ schemes for the family...


Normalized Model of Traffic Light Traits Based on Colored Pixels

Nowadays, because of the growing number of vehicles on streets and roads, the use of intelligent control systems to improve driving safety and health has become a necessity. To design and implement such control systems, having information about traffic light colors is essential. There is a wide variety of traffic lights in terms of light intensity and color. Therefore it seems that design and ...



Journal:
  • Probl. Inf. Transm.

Volume 43, Issue 

Pages  -

Publication date: 2007